The response rates for the questionnaire across Year 1 (21%), Year 2 (21%), and Year 3 (15%) were relatively low, which raises questions about the representativeness of the data and the potential for bias. A key concern is that the data may disproportionately reflect the perspectives of students who are already engaged, confident, or motivated to provide feedback. This “keen and friendly” subset may not fully capture the experiences of less engaged or less confident students, who may face the greatest challenges in the areas of inclusion, digital equity, and digital literacy. Understanding why certain groups did not participate is crucial for refining future data collection methods.
Challenges with the Questionnaire Design
The ambition of the questionnaire itself appears to have influenced the quality of responses. The detailed and extensive nature of the survey may have been overwhelming for some students, particularly those in Year 2, who provided no comments despite a reasonable response rate. This could suggest that the questionnaire's length or complexity discouraged detailed engagement. For Year 1 and Year 3, shorter surveys or more targeted questions may have yielded more focused insights and a greater number of recorded comments.
The observation that Year 1 and Year 3 students provided more comments, despite their lower or comparable response rates, suggests that a different design or delivery approach could have been more effective for Year 2. This may have been compounded by practical barriers, such as limited time to complete the survey or unclear instructions.
Delivery Methods and Refinements
The mode of delivery – a printed version completed in session versus a version distributed digitally or via tutors – could also have affected engagement. A printed version, with opportunities for in-session completion, might have improved response rates and allowed the questionnaire to reach students who are less proactive about seeking out or completing surveys independently.
Lessons for Future Surveys
- Streamline the Questionnaire: Reducing the number of questions, particularly for Year 2 students, could enhance engagement and completion rates while encouraging more detailed and thoughtful responses.
- Diversify Delivery Methods: Combining printed and digital delivery methods, while leveraging tutor-led sessions, could increase reach and accessibility.
- Address Potential Biases: Future surveys should include strategies to engage less responsive students, such as incentivising participation or directly addressing barriers they might face.
- Timing and Context: Administering surveys at a time when students are less pressured by deadlines and commitments could yield more representative and reflective data.
- Iterative Testing: Piloting the survey with a small group of students, not just colleagues, before wider distribution could help identify and resolve issues with question clarity, length, or accessibility.
In conclusion, the data collected provides valuable insights but is limited by low response rates and possible bias. In future, I will focus on refining the questionnaire's design and delivery to engage a broader, more representative sample of students. This will support a richer understanding of their experiences and enable more targeted interventions for inclusion, digital equity, and digital literacy.
I researched academic papers on low response rates and found an article, 'Solutions to address low response rates in online surveys' (Shiyab, W., Ferguson, C., Rolls, K. and Halcomb, E., European Journal of Cardiovascular Nursing, 22(4), May 2023, pp. 441–444, https://doi.org/10.1093/eurjcn/zvad030), which discusses strategies to enhance participation in online surveys, emphasising the importance of well-designed questionnaires, personalised invitations, and regular reminders.
The authors note:
‘low response rates remain a key contributor to bias and the overall quality of results. Having a well-designed survey, … sending personalised invitations, offering regular reminders, and using more than one recruitment strategy are evidence-based approaches to improve response rates.’
‘The role of personalised communication cannot be overstated; tailored messages increase the likelihood of participation by creating a sense of relevance and importance for the respondent.’
‘Using reminders at strategic intervals significantly boosts response rates, with the timing and tone of the reminders playing a crucial role in engaging participants.’
I will bear these points in mind when designing my next questionnaire.
Shiyab, W., Ferguson, C., Rolls, K. and Halcomb, E. (2023) 'Solutions to address low response rates in online surveys', European Journal of Cardiovascular Nursing, 22(4), pp. 441–444. https://doi.org/10.1093/eurjcn/zvad030