WebXR Mesh Detection Module

Draft Community Group Report

This version:
https://github.com/immersive-web/real-world-meshing/
Issue Tracking:
GitHub
Editor:
(Meta)
Participate:
File an issue (open issues)
Mailing list archive
W3C’s #immersive-web IRC

Abstract

Status of this document

1. Introduction

2. Initialization

2.1. Feature descriptor

In order for applications to signal their interest in using mesh detection during a session, the session must be requested with an appropriate feature descriptor. The string mesh-detection is introduced by this module as a new valid feature descriptor for the mesh detection feature.

A device is capable of supporting the mesh-detection feature if the device’s tracking system exposes a native mesh detection capability. The inline XR device MUST NOT be treated as capable of supporting the mesh-detection feature.

When a session is created with the mesh-detection feature enabled, the update meshes algorithm MUST be added to the list of frame updates of that session.

The following code demonstrates how a session that requires mesh detection could be requested:
const session = await navigator.xr.requestSession("immersive-ar", {
  requiredFeatures: ["mesh-detection"]
});
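For applications that can function without mesh data, the feature can instead be requested as optional, so that session creation does not fail on devices lacking a native mesh detection capability. The sketch below uses a hypothetical buildSessionInit helper (not part of the API) to show both variants:

```javascript
// Hypothetical helper that builds XRSessionInit options. Passing the
// descriptor via optionalFeatures lets session creation succeed even on
// devices without a native mesh detection capability.
function buildSessionInit(requireMeshes) {
  return requireMeshes
    ? { requiredFeatures: ["mesh-detection"] }
    : { optionalFeatures: ["mesh-detection"] };
}

// Browser-only usage (not runnable outside a WebXR-capable user agent):
// const session = await navigator.xr.requestSession(
//     "immersive-ar", buildSessionInit(false));
// const meshesEnabled = session.enabledFeatures?.includes("mesh-detection");
```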

3. Meshes

3.1. XRMesh

[Exposed=Window] interface XRMesh {
    [SameObject] readonly attribute XRSpace meshSpace;

    readonly attribute FrozenArray<Float32Array> vertices;
    readonly attribute Uint32Array indices;
    readonly attribute DOMHighResTimeStamp lastChangedTime;
    readonly attribute DOMString? semanticLabel;
};

An XRMesh represents a single instance of 3D geometry detected by the underlying XR system.

The meshSpace is an XRSpace that establishes the coordinate system of the mesh. The native origin of the meshSpace tracks the mesh’s center. The underlying XR system defines the exact meaning of the mesh center. The Y axis of the coordinate system defined by meshSpace MUST represent the mesh’s normal vector.

Each XRMesh has an associated native entity.

Each XRMesh has an associated frame.

The vertices attribute is an array of vertices that describes the shape of the mesh, expressed in the coordinate system defined by meshSpace.

The indices attribute is an array of indices into vertices, describing how the vertices are connected to form the faces of the mesh.
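Assuming a flat vertex layout (consecutive [x, y, z] triples) and indices grouped into consecutive triples of one triangle each — layout assumptions common in implementations, not normative text from this module — the geometry can be walked like this:

```javascript
// Sketch, assuming a flat vertex layout (x0,y0,z0, x1,y1,z1, ...) and
// indices grouped into consecutive triples, one triangle per triple.
// These layout assumptions are illustrative, not taken from the spec text.
function triangleCount(indices) {
  return Math.floor(indices.length / 3);
}

function getTriangle(vertices, indices, i) {
  // Returns the three [x, y, z] corners of triangle i, in meshSpace coords.
  const corners = [];
  for (let k = 0; k < 3; k++) {
    const v = indices[3 * i + k];
    corners.push([vertices[3 * v], vertices[3 * v + 1], vertices[3 * v + 2]]);
  }
  return corners;
}
```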

The lastChangedTime attribute is the last time any of the mesh’s attributes changed.

Note: The pose of a mesh is not considered a mesh attribute and therefore updates to mesh pose will not cause the lastChangedTime to change. This is because mesh pose is a property that is derived from two different entities - meshSpace and the XRSpace relative to which the pose is to be computed via getPose() function.
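Because pose-only updates do not bump lastChangedTime, an application can use the timestamp to avoid re-uploading unchanged geometry to the GPU. A minimal sketch, where geometryChanged and the cache Map are hypothetical app-side constructs:

```javascript
// Returns true when the mesh's geometry changed since the last call for
// that mesh, based on lastChangedTime; pose-only updates return false.
function geometryChanged(mesh, cache) {
  if (cache.get(mesh) === mesh.lastChangedTime) return false;
  cache.set(mesh, mesh.lastChangedTime);
  return true;
}

// Usage sketch inside a frame loop (uploadBuffers is hypothetical):
// const cache = new Map();
// for (const mesh of frame.detectedMeshes) {
//   if (geometryChanged(mesh, cache)) uploadBuffers(mesh);
// }
```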

4. Obtaining detected meshes

4.1. XRMeshSet

[Exposed=Window] interface XRMeshSet {
  readonly setlike<XRMesh>;
};

An XRMeshSet is a collection of XRMesh objects. It is the primary mechanism of obtaining the collection of meshes detected in an XRFrame.

partial interface XRFrame {
  readonly attribute XRMeshSet detectedMeshes;
};

XRFrame is extended to contain a detectedMeshes attribute which contains all meshes that are still tracked in the frame. The set is initially empty and will be populated by the update meshes algorithm. If this attribute is accessed when the frame is not active, the user agent MUST throw an InvalidStateError.

XRSession is also extended to contain an associated set of tracked meshes, which is initially empty. The elements of the set will be of XRMesh type.

In order to update meshes for frame, the user agent MUST run the following steps:
  1. Let session be the frame’s session.

  2. Let device be the session’s XR device.

  3. Let trackedMeshes be the result of calling into device’s native mesh detection capability to obtain the meshes tracked at frame’s time.

  4. For each native mesh in trackedMeshes, run:

    1. If desired, treat the native mesh as if it were not present in trackedMeshes and continue to the next entry. See § 6 Privacy & Security Considerations for criteria that could be used to determine whether an entry should be ignored in this way.

    2. If session’s set of tracked meshes contains an object mesh that corresponds to native mesh, invoke update mesh object algorithm with mesh, native mesh, and frame, and continue to the next entry.

    3. Let mesh be the result of invoking the create mesh object algorithm with native mesh and frame.

    4. Add mesh to session’s set of tracked meshes.

  5. Remove each object in session’s set of tracked meshes that was neither created nor updated during the invocation of this algorithm.

  6. Set frame’s detectedMeshes to the session’s set of tracked meshes.
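From the application’s side, the effect of the algorithm above is that frame.detectedMeshes gains and loses entries over time. Since XRMesh objects keep their identity while tracked, plain set membership is enough to diff frames; diffMeshes below is a hypothetical helper, not part of the API:

```javascript
// Computes which meshes appeared or disappeared between two frames.
// previous and current are set-like collections of XRMesh objects.
function diffMeshes(previous, current) {
  const added = [...current].filter((m) => !previous.has(m));
  const removed = [...previous].filter((m) => !current.has(m));
  return { added, removed };
}

// Browser-only usage in a frame callback:
// let prev = new Set();
// function onXRFrame(time, frame) {
//   const { added, removed } = diffMeshes(prev, frame.detectedMeshes);
//   prev = new Set(frame.detectedMeshes);
//   frame.session.requestAnimationFrame(onXRFrame);
// }
```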

In order to create mesh object from a native mesh object native mesh and XRFrame frame, the user agent MUST run the following steps:
  1. Let result be a new instance of XRMesh.

  2. Set result’s native entity to native mesh.

  3. Set result’s meshSpace to a new XRSpace object created with session set to frame’s session and native origin set to track native mesh’s native origin.

  4. Invoke update mesh object algorithm with result, native mesh, and frame.

  5. Return result.

A mesh object, result, created in such a way is said to correspond to the passed in native mesh object native mesh.

In order to update mesh object mesh from a native mesh object native mesh and XRFrame frame, the user agent MUST run the following steps:
  1. Set mesh’s frame to frame.

  2. Set mesh’s vertices to the new array of vertices representing native mesh’s vertices, performing all necessary conversions to account for differences in native mesh representation.

  3. Set mesh’s indices to the new array of indices representing native mesh’s indices, performing all necessary conversions to account for differences in native mesh representation.

  4. If desired, reduce the level of detail of the mesh’s vertices and indices as described in § 6 Privacy & Security Considerations.

  5. Set mesh’s lastChangedTime to frame’s time.

5. Native device concepts

5.1. Native mesh detection

The mesh detection API provides information about 3D surfaces detected in the users’ environment. It is assumed in this specification that user agents can rely on native mesh detection capabilities provided by the underlying platform for their implementation of the mesh-detection feature. Specifically, the underlying XR device should provide a way to query all meshes that are tracked at a time that corresponds to the time of a specific XRFrame.

Moreover, it is assumed that the tracked meshes, known as native mesh objects, maintain their identity across frames - that is, given a mesh object P returned by the underlying system at time t0, and a mesh object Q returned by the underlying system at time t1, it is possible for the user agent to query the underlying system about whether P and Q correspond to the same logical mesh object. The underlying system is also expected to provide a native origin that can be used to query the location of a mesh at time t, although it is not guaranteed that the mesh pose will always be known (for example, for meshes that are still tracked but not localizable at a given time). In addition, the native mesh object should expose the geometry describing the approximate shape of the detected mesh.

In addition, the underlying system should recognize native meshes as native entities for the purposes of XRAnchor creation. For more information, see WebXR Anchors Module § native-anchor section.

6. Privacy & Security Considerations

The mesh detection API exposes information about the users’ physical environment. The exposed mesh information (such as a mesh’s geometry) may be limited if the user agent so chooses. Some of the ways in which the user agent can reduce the exposed information are: decreasing the level of detail of a mesh’s geometry in the update mesh object algorithm (for example by decreasing the number of vertices, or by rounding / quantizing the coordinates of the vertices), or removing a mesh altogether by behaving as if it were not present in the trackedMeshes collection in the update meshes algorithm (this could be done, for example, if the detected mesh is deemed too small / too detailed to be surfaced and the user agent does not implement mechanisms to reduce the detail exposed on meshes). The poses of the meshes (obtainable from meshSpace) could also be quantized.
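One of the mitigations mentioned above, rounding / quantizing vertex coordinates, could look roughly like the following; the step size and approach are illustrative, not mandated by this module:

```javascript
// Snaps every coordinate to a coarse grid (step in meters), reducing the
// level of detail exposed to the page. Illustrative only; user agents are
// free to apply any detail-reduction strategy, or none.
function quantizeVertices(vertices, step) {
  return Float32Array.from(vertices, (v) => Math.round(v / step) * step);
}
```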

Since concepts from mesh detection API can be used in methods exposed by [webxr-anchors-module] specification, some of the privacy & security considerations that are relevant to WebXR Anchors Module also apply here. For details, see WebXR Anchors Module § privacy-security section.

Due to how the mesh detection API extends the WebXR Device API, the section WebXR Device API § 13. Security, Privacy, and Comfort Considerations is also applicable to the features exposed by the WebXR Mesh Detection Module.

7. Acknowledgements

The following individuals have contributed to the design of the WebXR Mesh Detection specification:

Conformance

Document conventions

Conformance requirements are expressed with a combination of descriptive assertions and RFC 2119 terminology. The key words “MUST”, “MUST NOT”, “REQUIRED”, “SHALL”, “SHALL NOT”, “SHOULD”, “SHOULD NOT”, “RECOMMENDED”, “MAY”, and “OPTIONAL” in the normative parts of this document are to be interpreted as described in RFC 2119. However, for readability, these words do not appear in all uppercase letters in this specification.

All of the text of this specification is normative except sections explicitly marked as non-normative, examples, and notes. [RFC2119]

Examples in this specification are introduced with the words “for example” or are set apart from the normative text with class="example", like this:

This is an example of an informative example.

Informative notes begin with the word “Note” and are set apart from the normative text with class="note", like this:

Note, this is an informative note.

Index

Terms defined by this specification

Terms defined by reference

References

Normative References

[HR-TIME-3]
Yoav Weiss. High Resolution Time. URL: https://w3c.github.io/hr-time/
[RFC2119]
S. Bradner. Key words for use in RFCs to Indicate Requirement Levels. March 1997. Best Current Practice. URL: https://datatracker.ietf.org/doc/html/rfc2119
[WEBIDL]
Edgar Chen; Timothy Gu. Web IDL Standard. Living Standard. URL: https://webidl.spec.whatwg.org/
[WEBXR]
Brandon Jones; Manish Goregaokar; Rik Cabanier. WebXR Device API. URL: https://immersive-web.github.io/webxr/

Informative References

[WEBXR-ANCHORS-MODULE]
Piotr Bialecki. WebXR Anchors Module. DR. URL: https://immersive-web.github.io/anchors/

IDL Index

[Exposed=Window] interface XRMesh {
    [SameObject] readonly attribute XRSpace meshSpace;

    readonly attribute FrozenArray<Float32Array> vertices;
    readonly attribute Uint32Array indices;
    readonly attribute DOMHighResTimeStamp lastChangedTime;
    readonly attribute DOMString? semanticLabel;
};

[Exposed=Window] interface XRMeshSet {
  readonly setlike<XRMesh>;
};

partial interface XRFrame {
  readonly attribute XRMeshSet detectedMeshes;
};