Metrics that Matter - What Data Should an L&D Team Track

The views and opinions expressed in this article are those of the author and do not necessarily reflect the official policy or position of Ausmed Education.


This article is part one of a three-part series by Kath Sharples on Metrics that Matter in healthcare.

Introduction

I wasn’t prepared for the state of unconscious incompetence I found myself in as a brand-new nurse educator in 2004. There I was, two-thirds of the way through a Master of Learning and Teaching in Healthcare and full of vim, ready to educate the masses towards a bright future of nurse-led quality care. Bless my naive little heart. The learning pathway that had led me to the title of ‘Learning Community Education Advisor’ (fancy, I know) had not entirely prepared me for the somewhat brutal reality of healthcare education. My ability to apply adult learning theory to the practice of facilitating learning wasn’t the problem. While my skills were fledgling, they were nonetheless adequate. No, my biggest shock was discovering that education just wasn’t valued out there in practice. Especially the continuing professional development (CPD) of nurses. Boom.

I can still feel that stomach-churning state of cognitive dissonance the first time I realised that the best way to turn ‘management’ off my education rhetoric was to mention the relationship between lifelong learning and quality care. I literally watched eyes glaze over while I extolled the benefits of professional development. I concluded that the synergy between learning and improved standards of care wasn’t being prioritised by managers and executives across the health, disability, or aged care nursing workforce. ‘What is wrong with these people?’ I thought, these so-called healthcare managers who constantly talk of financial constraints instead of the essential things like patient/resident care. I openly criticised any member of an executive team or organisational leader who did not align with my point of view regarding the value of CPD. How ignorant could these people be, I raged; look at all the evidence they are choosing to ignore.

The Value of Data in Education

If you wear a Learning and Development (L&D) hat and are employed in the Australian health, aged, or disability care sector, I’m confident you will know exactly what I’m talking about here. I know that you will have extolled the value of education programs and will have tirelessly championed the relationship between learning and quality of care. You will have diligently issued surveys before, during and after every single online program or face-to-face study day and have mountains of paper all with the 'strongly agree' box ticked. You may even have turned your surveys into multicoloured column charts and referenced studies demonstrating the association between increased nurse education and decreased hospital mortality (Aiken et al., 2014), studies that highlight CPD as essential to the provision of person-centred, safe and effective care (King et al., 2021), and studies that establish the importance of investing in the professional development of nurse leaders for safe patient services (Todd et al., 2024). I have no doubt that your L&D team has presented a plethora of data and evidence-based literature only to see your efforts repeatedly fail to garner much enthusiasm. Perhaps you and your team feel worn down, burnt out and ready to quit, either quietly or loudly.

What you may not have considered is that while your argument is correct, the real evidence you need to persuade management to invest in learning is unlikely to be found in yet more statistics regarding the relationship between education and quality care. This is not in question, and I will always be on the side of King et al. (2021), who remain adamant that there needs to be investment in the development of nurses and other members of the interprofessional healthcare team at all stages of their careers if the healthcare workforce is to be fully prepared and skilled to provide high-quality care, transformational preceptorship and leadership. However, our tendency to focus on the qualitative benefits of learning often means that we put little or no energy into quantitative evidence, for example, demonstrating that education delivers a positive return on investment (Opperman et al., 2018). The issue isn’t so much ‘us and them’; instead, the problem is that we may be failing to provide the type of metrics and data points that showcase the cost-saving rather than cost-draining attributes of education within an organisation.
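To show what that quantitative evidence can look like, the return-on-investment calculation commonly used in the professional development literature is simple arithmetic: the net benefit of a program divided by its cost. The sketch below uses entirely hypothetical figures and is there only to illustrate the shape of the calculation; attaching a credible dollar value to benefits such as avoided adverse events is the genuinely hard part, and it is exactly the kind of evidence this series is about.

```latex
% A commonly cited ROI calculation (all figures hypothetical, for illustration only)
\[
\text{ROI}\,(\%) = \frac{\text{benefit} - \text{cost}}{\text{cost}} \times 100
\]
% For example, an education program costing $20,000 that is credited with
% avoiding $35,000 in adverse-event and agency-staffing costs:
\[
\text{ROI} = \frac{35{,}000 - 20{,}000}{20{,}000} \times 100 = 75\%
\]
```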

Reframing Evaluation

The contributions and attributes of nurses as educators, administrators, communicators, statisticians, and environmental activists are well documented. These attributes were first established by Florence Nightingale, a passionate statistician who conducted extensive research and analysis (Peate, 2019). The problem is that her skill in mathematics, statistics, and data visualisation (Bradshaw, 2020) wasn’t the popular image of nurses that the public craved, with the result that this talented scientist is now remembered as a lamp-carrying nurse in the Crimean War.

It seems this image of nursing remains the default, most recently evidenced during the COVID-19 pandemic, when so much social commentary focused on the image of the heroic nurse, reflected in the media's use of virtuous statements about brave and tireless healthcare providers (Gündüz Hoşgör & Coşkun, 2024). If we are guilty of anything, it’s that we’ve forgotten what it is to think like a nurse. We’ve always been an art and a science; we’ve always known that the qualitative and quantitative sides are equally responsible for telling the whole story. So why, when building our case for education, do we so often fall short in identifying, analysing, and reporting that whole story with compelling qualitative and quantitative measures of success?

I’m all for the art, but if we as educators really want to be heard, then we must adopt a more scientific approach when it comes to evaluating our education programs. As a starting point, evaluation needs to consider three fundamental questions: firstly, what would my organisation consider success to look like; secondly, what evidence is available for me to measure this success; and thirdly, what are the easiest and most compelling options for presenting these measures of success? We need to stop fooling ourselves that an evaluation survey where participants strongly agree that they liked the temperature of the room, or free-text comments complaining about the lack of gluten-free options at morning tea, will be of any value. Let’s just agree that, as these types of feedback surveys have not worked to date, they should be excluded from any reported metrics if you want your L&D team to be taken seriously. Marks-Maran (2015) describes this approach to educational evaluation as equivalent to asking learners, ‘What did you think of the show?’: a quality assurance activity with little value, as it focuses primarily on the impact on the learner. Instead, we are advised to undertake an evaluative research approach, adopting a rigorous process of multi-modal data collection and analysis, perhaps even incorporating systematic research methodologies, tested tools, or ethical approval (Marks-Maran, 2015).

Finding an Evaluation Model

Wait a minute, I hear you cry. We have no time for research, and even if we had the time, we have no resources. This is why I’m certainly not advocating that all educational evaluation should become a randomised, double-blind controlled trial. What I will suggest, however, is that if you want to stop the ‘us and them’ roundabout, you will need to adopt more of a research mentality when it comes to presenting your evidence of success in relation to education. The starting point is to find a robust and evidence-based model that will not only allow you to categorise your evidence but also report it against measures that your organisation considers important. Personally, I have found the fourfold model of engagement, value, impact, and sustainability (Marks-Maran, 2015) the most useful in guiding my own practice, with each element consistently relevant within my context of education.
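To make that categorisation step concrete, here is a minimal sketch, in Python, of how an L&D team might tag the evidence it already collects against the four elements before reporting. The metric names and results are entirely hypothetical and exist only to illustrate the structure, not to prescribe what you should track.

```python
from collections import defaultdict

# The four elements of the evaluation model discussed above.
ELEMENTS = ("engagement", "value", "impact", "sustainability")

# Hypothetical evidence items an L&D team might already collect.
evidence = [
    {"element": "engagement",     "metric": "study day attendance",         "result": "87% of rostered staff"},
    {"element": "value",          "metric": "cost per completed module",    "result": "$42"},
    {"element": "impact",         "metric": "falls incidents post-program", "result": "down 12% over 6 months"},
    {"element": "sustainability", "metric": "in-house facilitators trained", "result": "6 new facilitators"},
]

def group_by_element(items):
    """Group evidence items under each element so they can be reported together."""
    grouped = defaultdict(list)
    for item in items:
        if item["element"] not in ELEMENTS:
            raise ValueError(f"Unknown element: {item['element']}")
        grouped[item["element"]].append(item)
    return grouped

if __name__ == "__main__":
    for element, items in group_by_element(evidence).items():
        print(element.upper())
        for item in items:
            print(f"  {item['metric']}: {item['result']}")
```

The tooling is beside the point; what matters is the discipline of deciding, before anything is reported, which element each data point is actually evidence for.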

Exploring Metrics that Matter

In this series of ‘Metrics that Matter - What Data Should an L&D Team Track’, I will be exploring the intricate relationship between education initiatives and their outcomes, using examples of qualitative and quantitative measures of success. The second article in this series will be dedicated to the themes of engagement and sustainability, while the third will focus on evaluative measures of value and impact. As an L&D professional, you can expect practical, realistic, and relevant data points that provide not only evidence of the value of educators within the organisation but also measures of success that articulate the value that education (you and your team) brings to an organisation.

References

Aiken, L. et al., 2014. Nurse staffing and education and hospital mortality in nine European countries: a retrospective observational study. Lancet, 383(9931), pp. 1824-1830.

Bradshaw, N., 2020. Florence Nightingale (1820–1910): An Unexpected Master of Data. Patterns, 1(2).

Gündüz Hoşgör, D. & Coşkun, F., 2024. Turkish Society’s Perception of Nursing Image during the COVID-19 Pandemic. BMC Nursing, 23(1), pp. 1-8.

King, R. et al., 2021. Factors that optimise the impact of continuing professional development in nursing: A rapid evidence review. Nurse Education Today, 98, p. 104652.

Marks-Maran, D., 2015. Educational research methods for researching innovations in teaching, learning and assessment: The nursing lecturer as researcher. Nurse Education in Practice, 15, pp. 472-479.

Opperman, C., Liebig, D., Bowling, J. et al., 2018. Measuring Return on Investment for Professional Development Activities: 2018 Updates. Journal for Nurses in Professional Development, 34, pp. 303-312.

Peate, I., 2019. Nursing is an art and a science: the year of the nurse and midwife. Journal of Aesthetic Nursing, 8(10).

Todd, D., Deal, J. & Parker, C., 2024. Virginia Henderson Institute of Clinical Excellence Nurse Leader Academy: An Innovative Approach to Nurse Leadership Development. Nurse Leader, 22(1), pp. 66-72.

Author

Kath Sharples - Health Education Consultants Australia

Kath Sharples is a specialist educator whose expertise is in work-based professional development and operational/strategic education leadership across private and public healthcare organisations, higher education, and aged care.

She has international experience in the strategic planning and delivery of innovative approaches to continuing professional development and the translation of evaluative research into evidence-based best practice. Kath founded Health Education Consultants Australia (HECA) in 2017. She is a Fellow of the Higher Education Academy (UK).