Promoting Privacy, Security, and Ethics in Immersive Environments, a VRforHealth Interview with MedXRSI

Julia Scott, Director of the Brain and Memory Care Lab at Santa Clara University and Executive Lead at MedXRSI, answered questions from Denise Silber of VRforHealth.

“We must build our systems to ensure that our rights are protected on all platforms, physical and virtual.” – Julia Scott for MedXRSI

Having previously introduced X Reality Safety Intelligence (XRSI) to the VRforHealth community through an interview with Valentino Megale, we invite you to learn about Julia A. Scott, who joined XRSI as Executive Lead of the Medical XR Advisory Council (MedXRSI) in 2023.

XRSI, headquartered in the US, is a worldwide non-profit organization that promotes privacy, security, and ethics in immersive environments, including virtual reality. Its mission is “to help build safe and inclusive experiences by discovering novel cybersecurity, privacy, and ethical risks and proposing potential solutions to mitigate them.” VRforHealth encourages our community to learn about this mission, which contributes to the safety of immersive technologies.

******** 

DS: You are the Director of the Brain and Memory Care Lab at Santa Clara University where you’ve been engaged for over 7 years. How did you come to specialize in this field?
JS: Coming to where I am today was a combination of life experience and lifelong curiosity. I love neuroscience. It has always fascinated me that the three-pound mass in our heads seamlessly controls our every thought and action. After two decades of studying the mechanisms and problems of the brain, I became a little disillusioned with the purpose of all these studies, papers, and institutes. Around the same time, I was raising my young kids and had moved to the Bioengineering department at Santa Clara University (SCU). SCU has a strong foundation in engineering, and a new VR studio had recently opened. Facing the challenges of raising a child who was not well served by the educational and healthcare systems, and with my curiosity sparked by the possibilities of VR, I decided to shift the scope of my research from studying problems to building solutions. Gradually, I built a program of research centered on neurotechnology and virtual reality. This has been done within the infrastructure of the Healthcare Innovation and Design Program, which serves as a hub for partnerships between industry, faculty, and students tackling the complex challenges of healthcare.

DS: How did you connect with XRSI, and how does your role at XRSI complement your responsibilities at Santa Clara University?
JS: XRSI CEO Kavya Pearlman reached out to me through mutual connections for the lead role, mainly because of the nature of my research and its connection with the mission of MedXRSI. The Medical XR Advisory Council (MedXRSI) is dedicated to fostering responsible innovation, ensuring human safety, and promoting well-being within the realm of immersive technologies and their intersection with healthcare. At the time, MedXRSI needed someone who could activate the council around specific initiatives, bring a balanced approach to weighing inputs, and provide critical analysis of related research. Kavya saw that my history in neuroscience research and my current focus on emerging technologies would position me to fill this role as MedXRSI lead.
Santa Clara University is also positioned to be a helpful partner for the Council. The Markkula Center for Applied Ethics has already consulted on relevant reports, drawing on its international expertise in healthcare, technology, and internet ethics. A former fellow of the center, a surgeon and AI bioethics expert, has also joined the council. Recently, the center published “Ethics in the Age of Disruptive Technologies: An Operational Roadmap,” which serves as a guide for organizational leaders to responsibly utilize emerging technologies. The work with XRSI is also complementary to the activities and mission of SCU across disciplines such as Communications and Computer Science.

 
DS: What are the 2023 goals of the XRSI Medical Council?
JS: We have set specific objectives for this year. We focus on impact and refrain from becoming all talk and no action. Given that there is so much noise in the XR ecosystem, we tend to work quietly and intentionally to fill the gap. Whenever the opportunity presents itself, what we put out acts as a clear, ringing bell that cuts through the noise and calls for the industry’s attention, with the clear purpose of helping build a trustworthy and safe Medical XR ecosystem. What we heard from the community of users and creators of Medical XR applications is a need for concrete guidance on how to use and design systems safely. So this is where our current focus is, across three projects:
1. Baseline Standards for Patient Safety in the Metaverse: Create user-facing guidelines on safety and privacy to pair with Medical XR applications and for public dissemination.
2. Development Guide (MedXR Privacy and Safety Framework): Expand the general framework to encapsulate the considerations needed for Medical XR applications and Metaverse platforms.
3. Use Case Research Study for the Medical XR Ecosystem: Synthesize a two-part report for consumer and clinical contexts, based on surveys, interviews, and case studies with organizations, users, and developers.


DS: XRSI seeks to educate and protect the public regarding the confidentiality of health data and the quality and security of XR applications. How does XRSI envisage its relationship with the relevant government agencies, nationally and internationally?
JS: XRSI engages with government organizations across the globe to raise awareness of the risks and opportunities of these technologies. When requested, XRSI also serves as an advisor to organizations on their specific needs. In the healthcare realm, MedXRSI has worked with the United States Food and Drug Administration (US FDA) on the foundations for regulatory guidelines for XR-based software as a medical device. The National Health Service in the United Kingdom (NHS-UK) has been a key member of the council since the very beginning, and we work alongside the NHS in numerous ways, including research exchange and strategic development for patient education and developer guidance. The trusting relationships we have fostered over the years have led several global governments and their policymakers to seek out our input. We intend to inform and educate leadership to help build the protections that will safeguard the public and foster trust as these new technologies become normalized.


DS: You must be closely following the news of the drop in the minimum age requirement from 13 to 10 for the use of Meta Quest VR headsets. What do you think of this as a neuroscientist and someone concerned with privacy? I am not referring to short supervised sessions for clinical or educational purposes.

JS: We have given considerable thought to this shift, which preceded the announcement of Roblox’s arrival on Quest. There are several dimensions to highlight, as also captured in a recent analysis we published via this article. First, there is the developmental difference in our sensorimotor systems between ages 10 and 13. Younger children have a harder time orienting themselves in virtual spaces because the integration of their visual and vestibular systems is not as mature. We also do not yet know what the implications for vision will be with extended use. We already see the acceleration of myopia onset due to screen usage.
Even deeper than the senses is the perception of reality. VR is a powerful simulation tool that makes us feel like we are present and embodied, not observers grounded in the physical world. A user of VR must have top-down control of their attention systems and the life experience to fully discern which memories were formed in a simulated space and which were not. Do we know that ten-year-olds around the world are ready for this type of challenge? What will happen when children this age transfer their Roblox time, which averages hours a day, to a VR headset? How will interactions between users be moderated to prevent psychological harm in an embodied experience? Meta responds to these concerns by delegating consent, supervision, governance responsibility, and safety controls to parents and guardians. Roblox has yet to publicly comment on these concerns. Based on the current research of XRSI’s Child Safety Initiative, a more balanced approach, cautious deployment of child-focused platforms, and thorough parent education are needed as these changes occur. At the end of the day, the essential need is a framework to ensure trust and safeguard children on these powerful platforms and systems, similar to the one being developed by the XRSI Child Safety Initiative, called the Guardian and Shield Framework (XRSI-GSF).

DS: Which of the latest developments in XR for healthcare are you inspired by?
JS: There is so much to celebrate in regard to XR for healthcare. AR and VR are transforming physical therapy and rehabilitation. It is not just about efficiency and engagement: these platforms are achieving effects that could not be seen with traditional practices or medicine. At their core, all of these applications harness the power of XR to influence our attention and perception to “trick” us into doing what we thought we couldn’t do. Several companies have brought their research-backed products to market and are reaching more patients.
On the development of safeguards, I see several governments and the UN working to get ahead of the acceleration of neurotechnology and other biometric sensors in our everyday lives. Chile has introduced legislation on neurorights, and the UN has solicited proposals on the impact of neurotechnology and intersecting technologies like XR and AI. I am encouraged by the forethought global organizations and the public are giving to this topic, and I am proud to have contributed to one of the UN advisories on human rights and neurotechnologies. Our relationship with technology is one of dependency. We must build our systems to ensure that our rights are protected on all platforms, beyond medical use, whether physical or virtual.