Talking Mimes VR

Kia ora. Justin Kennedy-Good mentioned that there was some discussion around disability and VR in the round table yesterday. My husband and I created this open-access VR experience + workshop toolkit. The experience is shot from the perspective of someone with locked-in syndrome (unable to move or communicate), and the workshop helps people reframe unhelpful thoughts and behaviours about disabled people.

You can access everything on the website: https://talkingmimes.com
Reach out to me if you’ve got any questions 🙂


A touching, funny, infuriating, harrowing and ultimately hopeful VR experience, drawn from true stories of people with profound physical disabilities. An emotional education piece with companion workshops developed for everyone.

There is a dementia one out of the UK that I use in some of my teaching.

A Walk Through Dementia — https://www.awalkthroughdementia.org/ — a free Android & iOS virtual reality app from Alzheimer’s Research UK giving an insight into life with dementia.

I’m also working on simulating visual field loss in VR, using data from actual field tests. There is an open platform called OpenVisSim by Pete Jones, a lecturer at UCL.

My work is going down a more commercial route at the moment, but Pete is consulting on it, so let me know if anyone wants an introduction.

Last I spoke with Pete he was working on eye tracking with VR.

GitHub — petejonze/OpenVisSim: an open-source, data-driven, gaze-contingent visual impairment simulator for VR/AR. https://github.com/petejonze/OpenVisSim
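To give a feel for the "data-driven" part: the general idea is to turn perimetry (visual field test) results into a per-location overlay that darkens or blurs the regions where sensitivity is reduced. This is a hypothetical sketch of that mapping, not OpenVisSim's actual API; the function name, dB range, and toy grid are all assumptions for illustration.

```python
def sensitivity_to_opacity(db, ceiling=35.0):
    """Map a perimetry sensitivity value (in dB) to an overlay opacity in [0, 1].

    0 dB (no measurable sensitivity) -> fully opaque overlay;
    `ceiling` dB (roughly normal) -> fully transparent.
    The 35 dB ceiling is an illustrative assumption, not a clinical standard.
    """
    clamped = max(0.0, min(db, ceiling))
    return 1.0 - clamped / ceiling

# A toy 3x3 patch of field-test sensitivities (dB); around 30 dB is roughly normal.
field = [
    [30, 28, 25],
    [27,  5,  0],   # a deep scotoma toward the centre-right
    [29, 26, 24],
]

# Per-location overlay opacities, ready to render as a mask over the scene.
mask = [[round(sensitivity_to_opacity(v), 2) for v in row] for row in field]
```

In a gaze-contingent simulator the resulting mask would additionally be re-centred on the wearer's fixation point each frame using eye tracking, which is why Pete's eye-tracking work fits in here.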

@sarv great idea. I use VR/AR in my teaching a bit, and as another application, a while back we designed a handover / exploration of other disciplines (physio, paramedic, nurse) at hospital triage.

My PhD used VR to measure length in children (early in ARKit’s evolution; it was released halfway through my PhD). Happy to share if you want, @sarv and @Justin_Kennedy-Good_ADHB.
Authentic Interprofessional Health Education Scenarios Using Mobile VR — https://openrepository.aut.ac.nz/handle/10292/12234