
Digital technology and dance: Does motion capture offer an alternative to film?

Julia is a dance artist who lives and works in the Highlands. She collaborates with other artists and her local community to create live performance, dance for film and participatory dance events.

Fraser is a 3D animation artist, also based in the Highlands. He creates 3D graphics, visual effects, motion graphics and architectural visualisation. He collaborates with theatre companies and community groups, and teaches 3D animation to young people.

With support from the See Learn Share fund, they have begun a new collaboration and carried out the background work for a six-month proof-of-concept project called Perception Experiments, which aims to use motion capture technology to map the movement qualities of young people who have learning and physical disabilities.

Julia explains how the idea for the project originated, the learning she has gained from working with Fraser and their ambitions for future collaboration.


My interest in using motion capture was sparked back in April 2018 during a Y Dance workshop led by Benjamin Dunks, then Artistic Director of Attik Dance.

Ben had used motion capture technology called Perception Neuron as a training tool with his young company, allowing them to see their progress in dance technique without being distracted by their own appearance. It seemed to me that motion capture could – unlike film – offer the possibility of recording a person's movement without necessarily recording their identity.

This question of identity had been prompted by a conversation about informed consent that came up during my Imaginate at The Workroom residency back in October 2017. The residency was part of a project called Interactive Experiments – a collaboration involving three artists and several young people, most of whom have Autism Spectrum Disorder and one who has Angelman Syndrome, a developmental delay disorder. You can see some footage from that project on Vimeo.


After looking into an affordable motion capture kit called Perception Neuron, I decided to find a collaborator who could bring the technical knowledge required to create 3D animations from motion capture data.

I met Fraser through Eden Court Engagement’s Digital Team and from our first meeting the ideas flowed for a new project using motion capture and digital avatars to collaborate with the young people that I had met during Interactive Experiments.

Over the course of our meetings, we looked into the practicalities of the equipment and software, using information available online such as the Perception Neuron demo software.

Fraser ran tests to import motion capture data files into Blender 3D animation software to ensure this worked smoothly. We also discussed the ways we could gather input from the young people so that they would shape their own dance avatars.
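As an illustration of the kind of data that moves through this pipeline: motion capture suits such as the Perception Neuron typically export BVH files, a plain-text format that Blender can import directly. The toy Python sketch below (not part of Fraser's actual tests; the sample skeleton is invented for illustration) simply lists the joint names declared in a BVH hierarchy, to show how a captured performance is stored as a skeleton plus per-joint motion channels.

```python
# A BVH file starts with a HIERARCHY section defining the skeleton,
# followed by a MOTION section of per-frame channel values.
# This tiny sample is a two-joint skeleton for demonstration only.
SAMPLE_BVH = """HIERARCHY
ROOT Hips
{
    OFFSET 0.0 0.0 0.0
    CHANNELS 6 Xposition Yposition Zposition Zrotation Xrotation Yrotation
    JOINT Spine
    {
        OFFSET 0.0 10.0 0.0
        CHANNELS 3 Zrotation Xrotation Yrotation
        End Site
        {
            OFFSET 0.0 10.0 0.0
        }
    }
}
"""

def joint_names(bvh_text):
    """Return the joint names declared in a BVH hierarchy section."""
    names = []
    for line in bvh_text.splitlines():
        parts = line.strip().split()
        # Joints are declared with the keywords ROOT or JOINT
        if parts and parts[0] in ("ROOT", "JOINT"):
            names.append(parts[1])
    return names

print(joint_names(SAMPLE_BVH))  # ['Hips', 'Spine']
```

Inside Blender itself, the equivalent step is File > Import > Motion Capture (.bvh), which builds an animated armature from this same hierarchy; an avatar mesh can then be bound to that armature.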


Our discussions have given me a greater understanding of the workflow from live dance to captured data to avatar to animation, which has been crucial in helping me create a realistic timeline and budget for our six-month pilot project.

I’ve learned about the new version of Blender, due to launch in the next few months, which includes a real-time render engine called Eevee. This allows animators to see their animations as soon as they are created, without having to wait for them to be rendered.

We talked about the possibility of using this tool within our project to allow the young people to see their own avatars moving in relation to their motion capture selves in real time. This is quite ambitious but not out of the question, and could be a way to use motion capture in a live performance setting, where a dancer wearing the kit could respond to their avatar on screen in the moment.


During our time on See Learn Share, Fraser and I made a video to explain how we want to take forward our collaboration and demonstrate how motion capture can be used to map movement onto avatars. You can watch this video online at juliamcghee.co.uk/dance-in-schools/interactive-experiments/the-blog.