Key Privacy Considerations for Launching Your Presence in the Metaverse

By Ellie Marshall and Ronak Shah
November 30, 2022

Over the past few years, several technology companies have touted the potential of new interactive and immersive augmented reality (AR) or virtual reality (VR) worlds that individuals can join to socialize, learn, work, game and, importantly, shop. From promoting products to selling goods to providing services, the commercial opportunities in the metaverse could be many.

This is the second in a two-part series that considers the unique legal issues that the metaverse might raise. In the first part of the series, we examined the litigation challenges posed by the metaverse, with a focus on privacy and product liability litigation.

In this bulletin, we highlight, from a Canadian perspective, some key questions that organizations looking to establish a presence on a metaverse platform should be considering. As a multiverse of competing “metaverses” emerges, all types of organizations may be considering how to take part in this type of digital economy. However, just as with any new data-driven offering, organizations should think carefully and critically about how data associated with these initiatives will be governed, used and shared (including between and by the different metaverse platforms). If an organization’s participation in a metaverse results in its collection, use or disclosure of personal information, privacy laws will apply to the organization, even if it is not the entity hosting or facilitating the metaverse.

What is the purpose of joining a metaverse?

While it is unlikely that one single unified place called “the metaverse” will emerge, virtual AR & VR metaverses are positioned as new worlds for interacting with customers in a wide range of settings. Services could be offered in a VR version of a traditional retail store, physical goods could be ordered through an AR portal at the point of use, or digital goods such as NFTs could be sold or traded in an unrecognizable mythical gaming universe. Regardless of how other-worldly these developments could be, it is important for organizations to clearly identify why they are joining a metaverse.

Canadian data protection laws are based on fair information principles. These principles set the “ground rules” for the collection, use, retention and disclosure of personal information. Further, the federal Personal Information Protection and Electronic Documents Act (PIPEDA) provides that any collection, use or disclosure of personal information must only be for purposes that a reasonable person would consider appropriate in the circumstances. Accordingly, prior to establishing a metaverse presence, businesses should evaluate the appropriateness of their proposed personal information processing purposes and assess whether their practices are off-side the Office of the Privacy Commissioner of Canada’s Guidance on inappropriate data practices. Furthermore, businesses should undertake robust due diligence to understand whether sensitive personal information (e.g., biometric data such as facial movements or gaze tracking) will be processed or whether high-risk technologies (e.g., emotion/behavioural analysis technologies) may be engaged. The U.K. Information Commissioner’s Office recently noted that, prior to deploying high-risk AI technologies, organizations should assess whether those technologies are sufficiently developed to undertake the types of emotion/behavioural analysis they purport to undertake and whether they pose risks to vulnerable populations, raise issues of accuracy or could lead to discrimination.

If your organization is considering launching a presence in a metaverse, consider conducting a privacy impact assessment or reviewing your privacy program and data flow inventories to ensure that the activities you are conducting through a metaverse are captured. For example, if your organization offers a customer service tool through a metaverse, information about how the customer’s data is collected and used should be included in the organization’s privacy statements. Organizations should have processes in place to proactively evaluate and monitor the impact of their personal information processing activities in the metaverse, especially since Canadian privacy and artificial intelligence law is currently in flux. For more information, see our recent Blakes Bulletin on Bill C-27.

How will your organization obtain consent in a metaverse?

Unlike the EU’s General Data Protection Regulation and similar international privacy laws, which recognize multiple legal bases for processing, Canadian data protection laws provide individual consent as the only legal basis for processing personal information. This remains true under the recently amended Quebec Act respecting the protection of personal information in the private sector and under the federal government’s proposed Consumer Privacy Protection Act (CPPA), which, if Bill C-27 is passed, will replace PIPEDA.

Canadian data protection laws provide that consent is only valid if it is reasonable to assume that the individual understands the nature, purpose and consequence of the collection, use or disclosure of personal information to which they are consenting. The reasonable expectations of an individual and the sensitivity of the information are relevant to the nature of consent required. Further, an organization cannot tie the provision of products or services to a requirement to consent to the collection, use or disclosure of personal information, except to the extent that the collection, use or disclosure of personal information is necessary to provide the requested products or services. Additionally, individuals must be able to withdraw their consent at any time, subject to contractual and legal restrictions. 

Obtaining meaningful consent in a metaverse may be challenging. It is likely that only a few organizations will emerge as operators of popular metaverse platforms, and these platforms could become new types of “walled gardens”.

Before joining a metaverse, organizations that will be collecting, using and disclosing personal information in the new virtual world should consider carefully what tools the platform operators will provide to bring required consent notices to an individual’s attention and to ensure those notices are viewed before the processing activities occur. To the extent that organizations will rely on the metaverse platform providers to present notices or capture consent, they should ensure that their agreements with the platform outline any consent obligations, including who is responsible for obtaining consent, and that the consent obtained aligns with the parties’ roles (“controller” versus “processor”) and data uses. Organizations should also undertake meaningful diligence on the underlying data flows and processes, for example, by requesting consent flow charts.

How will your organization limit the collection of personal information?

In a fully virtual world, there could be endless streams of data available for an organization to process. For example, if an organization hosts a virtual conference, it could theoretically obtain data about all participant movements, interactions, conversations, and feedback – much richer data than a physical conference might generate.

Under Canadian privacy laws, organizations can only collect personal information needed to fulfill a legitimate identified purpose and can only collect personal information by fair and lawful means. Importantly, organizations should be aware that collecting less information reduces the risk of loss or inappropriate access, use or disclosure of personal information.

To ensure compliance with applicable Canadian data protection laws, organizations participating in a metaverse need to understand how data collection and processing will work, including data usage rights the platform itself could have vis-à-vis the data collected on the organization’s behalf. This can be accomplished through robust contractual controls and an agile data management system that ensures the organization identifies all types of personal information collected directly and indirectly from individuals.

Further, organizations should be sure that their data subject request management systems adequately address any metaverse offerings.

Can your organization protect itself against additional risks associated with collecting personal information?

It is difficult to ascertain all potential risks of joining a metaverse without considering the agreements that a metaverse platform may require commercial entities to enter into to offer services in these virtual worlds. Organizations should assess whether the potential interactions they can have with customers on a metaverse platform are worth the additional exposure associated with the collection of personal information.

Quebec’s recently amended Act respecting the protection of personal information in the private sector will soon provide for penalties of up to C$25-million or, if greater, an amount corresponding to 4% of the organization’s worldwide turnover for the preceding fiscal year. The proposed CPPA would, if passed, include similarly hefty fines for non-compliance.

It should also be noted that, in a metaverse, it might be difficult to ascertain where an interaction is occurring. Given the potential for significant fines under international data protection laws, organizations should think carefully about the various international legal regimes to which they are exposing themselves by participating in a metaverse.

For more information, please contact:

Ellie Marshall              +1-416-863-3053

or any other member of our Privacy & Data Protection group.