Neuravatar: using augmented reality in medical education

The understanding of human anatomy is vital to the delivery of healthcare. By drawing on the computer animation and visualisation skills available at BU, the team are developing an online medical teaching platform to help improve medical education and training.

The project

The understanding of human anatomy is vital to the delivery of healthcare. In medical education, this has historically been achieved through direct dissection of human cadavers by medical students or close observation of such dissection by an anatomist. This helps to develop a comprehensive understanding of the three-dimensional relationships of the structures of the human body, both when healthy and unwell. The intricate complexity of the human nervous system, combined with the vast range of neurological diseases, makes this area one of the most challenging in medical education.

Blending medical education and animation

The computer animation and visualisation skills already at BU offer a unique opportunity to develop a suite of tools ready for the influx of medical students, using immersive virtual reality and mixed reality techniques. The primary aim of the project is to develop a medical teaching platform that provides an anatomically correct three-dimensional teaching tool using virtual reality and mixed reality devices such as Microsoft’s HoloLens 2. This will facilitate a deeper understanding of the human body in real individuals. The platform will be developed to allow immersion in a range of clinical scenarios and to provide virtual training for students to supplement live clinical experience. This approach will be supported with modular teaching tools and case scenarios derived from real cases and outcomes. Future developments of the platform will include decision support tools, case recording, and data analytics tools to support machine learning and personalised actionable analytics.

The project will blend cutting-edge animation and visualisation techniques with digital health approaches and human-centred design principles to provide a platform to train the next generation of healthcare professionals. The project aims to build on work already undertaken between BU and NHS organisations in Dorset to develop a blended reality platform for undergraduate medical teaching and postgraduate training. Large clinical datasets from existing data repositories in Dorset will be used to train machine learning-driven education and decision support tools using supervised learning. Prospective clinical data, collected using the de-identification/re-identification pipeline being developed as part of the Dorset informatics strategy, will be subjected to unsupervised learning approaches to evaluate and improve accuracy.

Project funder

The project is funded via the Higher Education Innovation Fund and will run from April 2020 until July 2022.

Please contact Dr Xiaosong Yang (xyang@bournemouth.ac.uk) or Dr Rupert Page (Rupert.Page@poole.nhs.uk) if you have any questions about the project.